A globally convergent BFGS method for symmetric nonlinear equations
Authors
Abstract
Similar articles
A globally convergent BFGS method for nonlinear monotone equations without any merit functions
Since 1965, there has been significant progress in the theoretical study of quasi-Newton methods for solving nonlinear equations, especially in local convergence analysis. However, results on the global convergence of quasi-Newton methods are relatively scarce, especially for the BFGS method. To ensure global convergence, a merit function such as the squared-norm merit function is typically ...
A new backtracking inexact BFGS method for symmetric nonlinear equations
A BFGS method, combined with a new backtracking line search technique, is presented for solving symmetric nonlinear equations. The global and superlinear convergence of the method is established under mild conditions. Preliminary numerical results show that the proposed method outperforms the standard technique on the given problems. © 2007 Elsevier Ltd. All rights reserved.
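The abstracts above describe methods that pair a BFGS approximation of the (symmetric) Jacobian with a norm-reducing backtracking line search. A minimal sketch of that generic scheme follows; it is illustrative only, not the exact algorithm of any of the cited papers, and all function and parameter names are assumptions:

```python
import numpy as np

def bfgs_nonlinear(F, x0, tol=1e-8, max_iter=200, rho=0.5, sigma=1e-4):
    """Sketch of a BFGS-type iteration for F(x) = 0 with a backtracking
    line search that enforces a decrease in ||F||. Not the published
    algorithm -- a simplified illustration of the general idea."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))            # BFGS approximation to the symmetric Jacobian
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        d = np.linalg.solve(B, -Fx)          # quasi-Newton direction
        # backtracking: shrink the step until the residual norm decreases
        alpha = 1.0
        while np.linalg.norm(F(x + alpha * d)) > (1 - sigma * alpha) * np.linalg.norm(Fx):
            alpha *= rho
            if alpha < 1e-12:                # give up on this line search
                break
        s = alpha * d
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx                       # change in the residual
        # standard BFGS update; skipped when the curvature s^T y is not positive
        sy = s @ y
        if sy > 1e-12:
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
        x, Fx = x_new, F_new
    return x, Fx
```

For example, applying this to F(x) = Ax + x³ (componentwise cube) with a symmetric matrix A gives a system whose Jacobian A + diag(3x²) is symmetric, the setting these papers target.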
BFGS trust-region method for symmetric nonlinear equations
In this paper, we propose a BFGS trust-region method for solving symmetric nonlinear equations. The global c...
A Trust-Region-Based BFGS Method with Line Search Technique for Symmetric Nonlinear Equations
A trust-region-based BFGS method is proposed for solving symmetric nonlinear equations. In this algorithm, if the trial step is unsuccessful, a line search technique is used instead of repeatedly solving the subproblem of the standard trust-region method. We establish the global and superlinear convergence of the method under suitable conditions. Numerical results show that the given ...
A Globally Convergent Linearly Constrained Lagrangian Method for Nonlinear Optimization
For optimization problems with nonlinear constraints, linearly constrained Lagrangian (LCL) methods sequentially minimize a Lagrangian function subject to linearized constraints. These methods converge rapidly near a solution but may not be reliable from arbitrary starting points. The well-known MINOS code has proven effective on many large problems. Its success motivates us to propose a glo...
Journal
Journal title: Journal of Industrial & Management Optimization
Year: 2021
ISSN: 1553-166X
DOI: 10.3934/jimo.2021020